### Short history of computation
##### More Transistors (1950–2000)
For decades, we made computers faster by increasing clock speeds and adding more transistors to CPUs.
However, we eventually hit physical limits: around the mid-2000s, power consumption and heat dissipation made it impractical to keep raising clock speeds, even as transistor counts continued to grow.
##### Multi-Cores and Parallelism
To continue improving performance, we shifted to multiple CPUs and multi-core processors.
However, simply having more CPUs or cores does not automatically speed up applications.
To take advantage of multiple cores, applications must be designed to execute parallel processes—a concept known as parallelism.
##### Specialized Hardware and Parallelism
Beyond general-purpose CPUs, we explored specialized chips optimized for specific tasks.
One of the most successful examples is the GPU (Graphics Processing Unit), originally designed for graphics-intensive tasks like video games, 3D modeling, and video editing.
Later, we realized that parallelizing computations could enable GPUs to accelerate tasks beyond graphics, leading to GPGPU (General-Purpose GPU computing).
This is where NVIDIA introduced CUDA, a programming model that allows developers to leverage GPUs for high-performance computing in fields like AI, scientific simulations, and data processing.
### Parallelism and Concurrency
##### Multi-core parallelism and multi-threaded applications
Multi-core enables hardware-level parallelism where multiple processing cores physically execute instructions simultaneously.
Multi-threading enables an application to leverage multiple cores (parallelism at the application level).
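As a minimal sketch of application-level parallelism in Python (the task and names are illustrative assumptions, not from the notes), a CPU-bound job can be split into chunks and handed to a pool of worker processes, one per core:

```python
# Parallelism sketch: split a CPU-bound task across worker processes
# so the OS can schedule the chunks on different cores simultaneously.
from concurrent.futures import ProcessPoolExecutor

def count_primes(limit):
    """CPU-bound task: count primes below `limit` by trial division."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    limits = [10_000, 20_000, 30_000, 40_000]
    # Each chunk runs in its own process, so the chunks can truly
    # execute at the same time on a multi-core machine.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(count_primes, limits))
    print(results)
```

Processes (rather than threads) are used here because in CPython only separate processes give CPU-bound code true multi-core parallelism.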
##### Time Sharing and concurrency
Even a computer with a single CPU core appears to run many processes simultaneously.
How is this possible with only one core? The answer lies in time sharing.
The operating system (OS) allocates CPU time to different processes in rapid succession.
This switching happens so quickly that it creates the illusion of multiple processes running concurrently, ensuring efficient multitasking.
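The same interleaving can be observed inside one program (a small sketch with illustrative names; the bookkeeping list is an assumption for demonstration). In CPython, the interpreter runs only one thread's bytecode at a time, yet both threads finish because the scheduler rapidly switches between them:

```python
# Concurrency sketch: two threads share one interpreter, and the
# scheduler interleaves their execution so both make progress.
import threading

events = []  # records which thread ran which step

def worker(name, steps):
    for i in range(steps):
        events.append((name, i))

t1 = threading.Thread(target=worker, args=("A", 3))
t2 = threading.Thread(target=worker, args=("B", 3))
t1.start(); t2.start()
t1.join(); t2.join()

# Both threads completed every step, in some interleaved order
# chosen by the scheduler.
print(sorted(events))
```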
##### Concurrency
Multiple processes share the same resource (for example, one CPU core) at the same time: their execution overlaps, but they do not necessarily run simultaneously.
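Sharing a resource concurrently requires coordination. A minimal sketch (names are illustrative): two threads increment the same counter, and a lock keeps each read-modify-write step atomic so no update is lost:

```python
# Shared-resource sketch: two threads update one counter; the lock
# serializes access so the final value is exact.
import threading

counter = 0
lock = threading.Lock()

def increment(times):
    global counter
    for _ in range(times):
        with lock:          # only one thread touches the counter at a time
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 200000: the lock prevents lost updates
```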
##### Parallelism
Multiple processes run on different resources (for example, separate CPU cores) at the same time, so they genuinely execute simultaneously.
##### Asynchronous
Asynchronous execution is a form of concurrency: a single thread starts an operation (typically I/O) and continues with other work instead of blocking until that operation finishes.
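A minimal sketch with `asyncio` (the delays stand in for network calls; names are illustrative): one thread launches two waits and lets them overlap, so the total time is roughly the longest wait, not the sum:

```python
# Asynchronous sketch: two simulated I/O waits overlap on one thread.
import asyncio
import time

async def fetch(name, delay):
    await asyncio.sleep(delay)  # stands in for a slow network call
    return name

async def main():
    start = time.perf_counter()
    # Both "requests" are in flight at the same time on a single thread.
    results = await asyncio.gather(fetch("a", 0.2), fetch("b", 0.2))
    return results, time.perf_counter() - start

results, elapsed = asyncio.run(main())
print(results, round(elapsed, 1))  # finishes in about 0.2 s, not 0.4 s
```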